Web Survey Bibliography
Do survey respondents, recruited with extraordinary efforts, provide answers of lower quality than respondents who are recruited more easily? This question has worried survey practitioners and analysts alike for at least four decades (Cannell and Fowler 1963; Robins 1963), but a definitive answer remains elusive. Although no direct relationship exists between response rates and nonresponse bias (Groves and Peytcheva 2008), the relationship between efforts to increase response rates and other sources of survey error, in particular measurement error, remains an open question. The common hypothesis is that those who require greater recruitment effort provide answers of lower quality than those who are recruited more readily. A latent cooperation continuum is often posited (e.g., Mason, Lesser, and Traugott 2002; Cannell and Fowler 1963), such that those who are most difficult to recruit into the sample pool have the lowest motivation and are thus the worst reporters. However, it is not clear whether this is generally the case.
This paper reviews the existing literature on the relationship between the level of effort exerted to recruit sample members and data quality, with a primary focus on item nonresponse. Two methods – a quantitative meta-analysis and a systematic qualitative review – are applied to 44 articles examining the relationship between levels of recruitment effort and measurement error. These studies use five different measures of level of effort (number of contact attempts/follow-up reminders, refusal conversion, date of interview, a combination of these three, and estimated response propensity) and examine multiple measurement error indicators (e.g., item nonresponse, response accuracy, signed deviations, scale reliability, acquiescence, non-differentiation).
This review asks the following four questions.
1. Do respondents recruited with more effort have higher item nonresponse rates than respondents recruited more easily?
2. Are particular study characteristics associated with higher item nonresponse rates among respondents recruited with more effort relative to respondents recruited with less effort?
3. Are particular types of items associated with higher item nonresponse rates among respondents recruited with more effort relative to respondents recruited with less effort?
4. Is there consistent evidence about higher or lower levels of other types of measurement error, and does it vary by the level of effort measure?
Web survey bibliography (317)
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Respondent mode choice in a smartphone survey ; 2017; Conrad, F. G., Schober, M. F., Antoun, C., Yan, H. Y., Hupp, A., Johnston, M., Ehlen, P., Vickers, L...
- Collecting Data from mHealth Users via SMS Surveys: A Case Study in Kenya; 2016; Johnson, D.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- The 2013 Census Test: Piloting Methods to Reduce 2020 Census Costs; 2016; Walejko, G. K.; Miller, P. V.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Do Polls Still Work If People Don't Answer Their Phones?; 2016; Edwards-Levy, A.; Jackson, N. M.
- HUFFPOLLSTER: Why Reaching Latinos Is A Challenge For Pollsters; 2016; Jackson, N. M.; Edwards-Levy, A.; Velencia, J.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Mixed mode surveys ; 2015; Burton, J.
- Two Are Better Than One: The Use of a Mixed-Mode Data Collection to Improve the Electoral Forecast; 2014; de Rada, V. D., Pasadas del Amo, S.
- The impact of contact effort on mode-specific selection and measurement bias; 2014; Schouten, B., van der Laan, J., Cobben, F.
- How much is shorter CAWI questionnaire VS CATI questionnaire?; 2014; Bartoli, B.
- Advantages of a global multimodal print & digital readership survey; 2013; Cour, N., Saint-Joanis, G.
- Relative Mode Effects on Data Quality in Mixed-Mode Surveys by an Instrumental Variable; 2013; Vannieuwenhuyze, J. T. A., Revilla, M.
- A report on the Confirmit Market Research Software Survey 2013; 2013; Macer, T., Wilson, S.
- Mode effect analysis and adjustment in a split-sample mixed-mode Web/CATI survey; 2013; Kolenikov, S., Kennedy, C.
- Evaluating the left‐right dimension: Category Selection Probing conducted in an online access...; 2013; Huefken, V.
- Methodological, legal and technical perspectives on the feasibility of web survey paradata in German...; 2013; Sattelberger, S.
- Impact of mode design on reliability in longitudinal data; 2013; Cernat, A.
- Exploring patterns of academic usage: A Google Scholar based study of ESS, EVS, WVS and ISSP academic...; 2013; Malnar, B.
- Web questionnaires in official population surveys: Do's and don'ts First experiments and impacts...; 2013; Blanke, K.
- Mode effects in Labour Force Surveys - do they really matter?; 2013; Koerner, T.
- Measuring the same concepts in several modes in the "BIBB/BAuA-Employee-Survey 2011/12" ; 2013; Gensicke, M., Tschersich, N., Hartmann, J.
- What works? Getting the General Population To Go Online in a Mixed Mode Local Health Survey; 2013; Frigault, L.-R., Azzou, S. A. K., Molloy, E. J. K., Ammarguellat, F., Couture, M., Gratton, J.
- Using Technology to Conduct Questionnaire Evaluations with Hard to Reach Populations ; 2013; Ridolfo, H., Ott, K.
- Mode Effects in a National Establishment Survey; 2013; Daley, K., Phillips, B. T.
- Evaluating the Effect of a Non-Monetary Incentive in a Nationally Representative Mixed-Mode Establishment...; 2013; Sengupta, M., Harris-Kojetin, L., Hobbs, M., Greene, A.
- Survey Reminder Method Experiment: An Examination of Cost Efficiency and Reminder Mode Salience in the...; 2013; Anderson, M., Rogers, B., CyBulski, K., Hall, J. W., Alderks, C. E., Milazzo-Sayre, L.
- Experiences from a probability-based Internet panel: Sample, recruitment and participation; 2013; Scherpenzeel, A.
- An Evaluation of Internet Versus Paper-based Methods for Public Participation Geographic Information...; 2012; Pocewicz, A.; Nielsen-Pincus, M.; Brown, G.; Schnitzer, R.
- Using paradata to explore item-level response times in surveys; 2012; Couper, M. P., Kreuter, F.
- Specialized Tools for Measuring Past Events ; 2012; Belli, R. F.
- Modes of Data Collection; 2012; Tourangeau, R.
- Mode and non-response effects and their treatment; 2012; Chrysanthopoulos, S., Georgostathi, A.
- “I think I know what you did last summer” Improving data quality in panel surveys; 2012; Lugtig, P. J.
- Using Text-to-Speech (TTS) for Audio-CASI; 2012; Couper, M. P., Kirgis, N., Buageila, S., Berglund, P.
- Does Mode Matter? Initial Evidence from the German Longitudinal Election Study (GLES); 2012; Blumenstiel, J. E., Rossmann, J.
- The Representativity of Web Surveys of the General Population compared to Traditional Modes and Mixed...; 2012; Klausch, L. T., Schouten, B., Hox, J.
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S., Bauknecht, J.
- Web based CATI on Amazon Elastic Compute Cloud and VirtualBox using queXS; 2011; Zammit, A.
- Web/Cloud Based CATI Using queXS; 2011; Zammit, A.
- When Referring to Mode, Is Expressed Preference the Same as Reality?; 2011; Denk, K.
- Three Eras of Survey Research; 2011; Groves, R. M.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.